In information theory, given an unknown stationary source π with alphabet ''A'' and a sample ''w'' drawn from π, the Krichevsky–Trofimov (KT) estimator produces an estimate ''p''<sub>''i''</sub>(''w'') of the probability of each symbol ''i'' ∈ ''A''. This estimator is optimal in the sense that it asymptotically minimizes the worst-case regret. For a binary alphabet and a string ''w'' containing ''m'' zeroes and ''n'' ones, the KT estimator can be defined recursively〔Krichevsky, R. E. and Trofimov, V. K. (1981), "The Performance of Universal Encoding", ''IEEE Transactions on Information Theory'', Vol. IT-27, No. 2, pp. 199–207.〕 as:

: <math>p_0(w) = \frac{m + 1/2}{m + n + 1}, \qquad p_1(w) = \frac{n + 1/2}{m + n + 1},</math>

and the probability assigned to the string itself is built up symbol by symbol from these conditional estimates:

: <math>P_{KT}(\varepsilon) = 1, \qquad P_{KT}(w0) = p_0(w)\,P_{KT}(w), \qquad P_{KT}(w1) = p_1(w)\,P_{KT}(w),</math>

where ε denotes the empty string.
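The recursion lends itself to a one-pass sequential computation: keep running counts of zeroes and ones and multiply the conditional estimates together as each symbol arrives. The following Python sketch is illustrative only and is not taken from the cited paper; the function names are chosen here for exposition.

<syntaxhighlight lang="python">
def kt_conditional(m, n):
    """Conditional KT estimates p_0(w), p_1(w) after seeing m zeroes and n ones."""
    total = m + n + 1
    return (m + 0.5) / total, (n + 0.5) / total


def kt_sequence_probability(bits):
    """Probability P_KT(w) assigned to a binary string by the recursion
    P_KT(w b) = p_b(w) * P_KT(w), starting from P_KT(empty string) = 1."""
    m = n = 0      # running counts of zeroes and ones
    prob = 1.0     # P_KT of the empty string
    for b in bits:
        p0, p1 = kt_conditional(m, n)
        if b:
            prob *= p1
            n += 1
        else:
            prob *= p0
            m += 1
    return prob


# Example: the string 0 1 1 0 receives probability
# (1/2) * (1/4) * (1/2) * (3/8) = 0.0234375
print(kt_sequence_probability([0, 1, 1, 0]))
</syntaxhighlight>

Because each conditional estimate adds 1/2 to the corresponding count, no symbol is ever assigned zero probability, even before it has been observed.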
==See also==
* Rule of succession
* Dirichlet-multinomial distribution